2024-12-26 10:54:51 · AIbase · 14.3k
China Telecom's Xingchen Model Selected as One of the 'National Treasures' of the Year
In the 'Top Ten National Treasures' annual selection initiated by the State-owned Assets Supervision and Administration Commission of the State Council, China Telecom's self-developed Xingchen Model made the list thanks to its groundbreaking technological achievements. As the country's first full-size, full-modal, fully domestically produced foundational model system, the Xingchen Model demonstrates exceptional capabilities in semantics, speech, vision, and multimodal fields. In the semantic domain in particular, the Xingchen Model has achieved significant breakthroughs. Trained on a fully domestic 10,000-card cluster with a domestic training framework, the model reaches over 93% of the computational efficiency of comparable NVIDIA hardware while also achieving a more favorable training-duration ratio.
2024-06-17 10:43:41 · AIbase · 9.6k
The Chinese University of Hong Kong Proposes MiCo, a Full-Modal Pre-training Paradigm Simulating the Human Brain's Cognitive Process
A research team from The Chinese University of Hong Kong and the Chinese Academy of Sciences has proposed a full-modal pre-training paradigm called MiCo (Multimodal Context). The method has achieved significant results in multimodal learning, setting new state-of-the-art (SOTA) records across 37 categories.